Some highlights on Source-to-Source Adjoint AD

Author

  • Laurent Hascoët
Abstract

Algorithmic Differentiation (AD) provides the analytic derivatives of functions given as programs. Adjoint AD, which computes gradients, is similar to Back Propagation for Machine Learning. AD researchers study strategies to overcome the difficulties of adjoint AD, to get closer to its theoretical efficiency. To promote fruitful exchanges between Back Propagation and adjoint AD, we present three of these strategies and give our view of their interest and current status.
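For readers coming from back-propagation, the following minimal Python sketch (our own illustration, not code from the paper) shows the shape of the adjoint code that a source-to-source tool such as Tapenade emits from a primal routine: a forward sweep that recomputes the primal and keeps the intermediates the reverse sweep will need, followed by a reverse sweep that propagates adjoints through the statements in reverse order.

    import math

    def f(x, y):
        # Primal code:  z = sin(x*y) + y*y
        a = x * y
        b = math.sin(a)
        return b + y * y

    def f_b(x, y, z_b=1.0):
        # Hand-written adjoint of f, in the style a source-to-source
        # tool would generate (names f_b, x_b, ... are illustrative).
        a = x * y                     # forward sweep
        b = math.sin(a)
        z = b + y * y
        b_b = z_b                     # reverse sweep: z = b + y*y
        y_b = 2.0 * y * z_b
        a_b = math.cos(a) * b_b       # b = sin(a)
        x_b = y * a_b                 # a = x*y
        y_b = y_b + x * a_b
        return z, x_b, y_b            # primal value and full gradient

One primal evaluation plus one reverse sweep yields the complete gradient, which is the property that adjoint AD shares with back-propagation.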


Similar articles

Interfacing OpenAD and Tapenade

Development of a capable algorithmic differentiation (AD) tool requires large developer effort to provide the various flavors of derivatives, to experiment with the many AD model variants, and to apply them to the candidate application languages. Considering the relatively small size of the academic teams that develop AD tools, collaboration between them is a natural idea. This collaboration ca...


Adjoint Algorithmic Differentiation Tool Support for Typical Numerical Patterns in Computational Finance

We demonstrate the flexibility and ease of use of C++ algorithmic differentiation (AD) tools based on overloading when applied to numerical patterns (kernels) arising in computational finance. While adjoint methods and AD have been known in the finance literature for some time, there are few tools capable of handling and integrating with the C++ codes found in production. Adjoint methods are also known to b...
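The abstract concerns C++ tools based on operator overloading. Purely to illustrate that overloading approach (this is not the API of any tool discussed above, and the class and variable names are ours), a minimal reverse-mode tape can be written in a few lines of Python:

    class Var:
        # One tape node: value, list of (parent, local partial), adjoint.
        def __init__(self, value, parents=()):
            self.value = value
            self.parents = list(parents)
            self.grad = 0.0

        def __add__(self, other):
            return Var(self.value + other.value,
                       [(self, 1.0), (other, 1.0)])

        def __mul__(self, other):
            return Var(self.value * other.value,
                       [(self, other.value), (other, self.value)])

        def backward(self, seed=1.0):
            # Reverse sweep: accumulate adjoints along the recorded graph.
            self.grad += seed
            for parent, partial in self.parents:
                parent.backward(seed * partial)

    # Toy usage: differentiate 0.5*(s - k) with respect to s and k.
    s, k = Var(100.0), Var(95.0)
    payoff = (s + k * Var(-1.0)) * Var(0.5)
    payoff.backward()
    print(payoff.value, s.grad, k.grad)   # 2.5  0.5  -0.5

Production C++ tools do the same recording at run time through overloaded arithmetic operators on an active type, which is what lets them integrate with existing pricing codes with limited source changes.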


Discretized Adjoint State Time and Frequency Domain Full Waveform Inversion: A Comparative Study

This study derives the discretized adjoint states full waveform inversion (FWI) in both time and frequency domains based on the Lagrange multiplier method. To achieve this, we applied adjoint state inversion on the discretized wave equation in both the time domain and the frequency domain. In addition, we introduce reliability tests to show that the inversion is performing as it should be ...
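For context, a generic discretized adjoint-state derivation via a Lagrange multiplier reads as follows (our notation, standard in the FWI literature; the paper's actual discretization, misfit and regularization may differ). With A(m) the discretized wave operator, u the wavefield, s the source, R the restriction to the receivers, d the observed data and \lambda the adjoint wavefield:

    J(m) = \tfrac{1}{2}\,\lVert R\,u(m) - d \rVert^2
      \quad\text{subject to}\quad A(m)\,u = s

    \mathcal{L}(u,\lambda,m) = \tfrac{1}{2}\,\lVert R\,u - d \rVert^2
      + \lambda^{T}\bigl(A(m)\,u - s\bigr)

    \partial_u \mathcal{L} = 0
      \;\Longrightarrow\; A(m)^{T}\lambda = -\,R^{T}\bigl(R\,u - d\bigr)

    \nabla_m J = \lambda^{T}\,\frac{\partial A(m)}{\partial m}\,u

The same construction applies in both the time and the frequency domain, with the appropriate discretized operator A(m) in each case.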


Generating Recomputations in Reverse Mode AD

The main challenge of the reverse (or adjoint) mode of automatic differentiation (AD) is providing the accurate values of required variables to the derivative code. We discuss different strategies to tackle this challenge. The ability to generate efficient adjoint code is crucial for handling large scale applications. For challenging applications, efficient adjoint code must provide at least a fract...
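Two standard ways of providing these values to the adjoint are to store them during the forward sweep or to recompute them when the reverse sweep needs them. A toy Python sketch of both extremes, for the loop x := sin(x) repeated n times (our example, not code from the paper):

    import math

    def adjoint_store(x, n, x_bar=1.0):
        # Store-all: the forward sweep pushes each overwritten x,
        # the reverse sweep pops it back to evaluate the partials.
        stack = []
        for _ in range(n):
            stack.append(x)
            x = math.sin(x)
        for _ in range(n):
            x_bar *= math.cos(stack.pop())
        return x_bar

    def adjoint_recompute(x0, n, x_bar=1.0):
        # Recompute-all: nothing is stored; the value of x at each
        # iteration is recomputed from the input x0 when needed.
        for i in reversed(range(n)):
            x = x0
            for _ in range(i):
                x = math.sin(x)
            x_bar *= math.cos(x)
        return x_bar

Store-all costs O(n) memory and O(n) operations, recompute-all costs O(1) memory and O(n^2) operations; checkpointing schemes trade between these two extremes.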


"To be recorded" analysis in reverse-mode automatic differentiation

The automatic generation of adjoints of mathematical models that are implemented as computer programs is receiving increased attention in the scientific and engineering communities. Reverse-mode automatic differentiation is of particular interest for large-scale optimization problems. It allows the computation of gradients at a small constant multiple of the cost for evaluating the objective fu...
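The "to be recorded" (TBR) analysis referred to in the title decides which intermediate values the forward sweep must push on the tape. In the following toy Python sketch (example and names are ours), x is recorded because it appears in a partial derivative and is overwritten, while y is used only linearly and need not be recorded:

    import math

    stack = []

    def forward(x, y):
        stack.append(x)      # x occurs in d(sin(x))/dx and is overwritten: record it
        x = math.sin(x)
        y = 2.0 * y          # partial is the constant 2.0: nothing to record
        return x + y

    def reverse(r_bar=1.0):
        x_bar = r_bar        # adjoint of  r = x + y
        y_bar = r_bar
        y_bar = 2.0 * y_bar  # adjoint of  y = 2.0*y
        x = stack.pop()      # restore the recorded value
        x_bar = math.cos(x) * x_bar   # adjoint of  x = sin(x)
        return x_bar, y_bar

Recording fewer values directly reduces the tape size, which is why such an analysis matters for large-scale adjoints.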




Publication year: 2017